Oracle Inequalities and Adaptive Rates
Abstract
We have previously seen how sieve estimators give rise to rates of convergence to the Bayes risk by performing empirical risk minimization over $\mathcal{H}_{k(n)}$, where $(\mathcal{H}_k)_{k \ge 1}$ is an increasing sequence of sets of classifiers and $k(n) \to \infty$. The resulting rate of convergence depends on $k(n)$, which is usually chosen to minimize the worst-case rate over all distributions of interest. It would be preferable, however, to automatically obtain a faster rate of convergence when the distribution is more favorable. Since we do not know a priori whether our distribution is worst-case, we do not know how to choose $k(n)$, and adaptive rates of convergence are not possible with sieve estimators. Adaptive rates are possible, however, using another learning strategy called penalized empirical risk minimization. With this approach we can prove a so-called “oracle inequality” that expresses the ability of penalized ERM to automatically select a classifier of the appropriate complexity, and thereby achieve improved rates of convergence, even when the best model class $\mathcal{H}_k$ depends on some unknown property of the distribution. In these notes we consider oracle inequalities in the context of dyadic decision trees and of general VC classes. In the latter case, penalized empirical risk minimization is known as structural risk minimization.
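To make the strategy concrete, here is a minimal sketch of penalized ERM over a nested family of classifier classes. The choice of classes (depth-k decision trees), the penalty constants, and the helper names penalty and penalized_erm are illustrative assumptions, not the construction analyzed in the notes; the penalty simply mimics the square-root, VC-type complexity term that appears in such oracle inequalities.

```python
# A minimal sketch of penalized empirical risk minimization (structural risk
# minimization) over a nested sequence of classifier classes.  Here H_k is the
# set of decision trees of maximum depth k -- an illustrative choice, not the
# dyadic decision trees of the notes -- and the penalty uses a generic
# sqrt(complexity * log n / n) form as a stand-in for a VC-type bound.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def penalty(k, n, delta=0.05):
    """Complexity penalty for class H_k (hypothetical VC-style form)."""
    complexity = 2 ** k            # number of leaves as a crude size proxy
    return np.sqrt((complexity * np.log(n) + np.log(1.0 / delta)) / n)

def penalized_erm(X, y, max_k=8):
    """Return the classifier minimizing empirical risk + penalty over k."""
    n = len(y)
    best, best_score = None, np.inf
    for k in range(1, max_k + 1):
        h = DecisionTreeClassifier(max_depth=k).fit(X, y)   # ERM within H_k
        emp_risk = np.mean(h.predict(X) != y)               # training error
        score = emp_risk + penalty(k, n)
        if score < best_score:
            best, best_score = h, score
    return best

# Toy usage: the selected depth adapts to the complexity of the true boundary.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)
clf = penalized_erm(X, y)
```

The point of the penalty is that the selected class balances approximation error against estimation error without knowing in advance which k is best for the underlying distribution.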
Similar Resources
Adaptive estimation of the intensity of inhomogeneous Poisson processes via concentration inequalities
In this paper, we establish oracle inequalities for penalized projection estimators of the intensity of an inhomogeneous Poisson process. We then study the adaptive properties of penalized projection estimators. First, we provide lower bounds for the minimax risk over various sets of smoothness for the intensity, and then we prove that our estimators achieve these lower bounds up to so...
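As a rough illustration of this kind of estimator (a sketch under assumptions, not the paper's construction): project the intensity onto histogram models with D equal bins on [0, 1] and select D by a penalized least-squares contrast. The penalty constant c and the function names below are hypothetical.

```python
# Sketch of a penalized projection (histogram) estimator of a Poisson intensity
# on [0, 1], from n_copies pooled i.i.d. realizations of the process.

import numpy as np

def histogram_intensity(events, n_copies, D):
    """Projection of the intensity onto piecewise constants with D bins."""
    counts, edges = np.histogram(events, bins=D, range=(0.0, 1.0))
    return counts / (n_copies * (1.0 / D)), edges     # estimated bin heights

def penalized_projection(events, n_copies, max_D=50, c=2.0):
    """Select D minimizing the empirical contrast plus a penalty ~ c*D/n."""
    best_D, best_crit = 1, np.inf
    for D in range(1, max_D + 1):
        heights, _ = histogram_intensity(events, n_copies, D)
        # least-squares contrast: ||lambda_D||^2 - (2/n) * sum_i lambda_D(T_i)
        l2_norm_sq = np.sum(heights ** 2) / D
        idx = np.minimum((events * D).astype(int), D - 1)
        contrast = l2_norm_sq - 2.0 * np.sum(heights[idx]) / n_copies
        crit = contrast + c * D / n_copies
        if crit < best_crit:
            best_D, best_crit = D, crit
    return best_D

# Toy usage: events from intensity 5 + 4*sin(2*pi*t), simulated by thinning.
rng = np.random.default_rng(1)
n_copies, lam_max = 200, 9.0
t = rng.uniform(0, 1, rng.poisson(lam_max * n_copies))
events = t[rng.uniform(0, 1, len(t)) < (5 + 4 * np.sin(2 * np.pi * t)) / lam_max]
D_hat = penalized_projection(events, n_copies)
```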
Adaptive Estimation of and Oracle Inequalities for Probability Densities and Characteristic Functions
The theory of adaptive estimation and oracle inequalities for the case of Gaussian-shift–finite-interval experiments has made significant progress in recent years. In particular, sharp-minimax adaptive estimators and exact exponential-type oracle inequalities have been suggested for a vast set of functions including analytic and Sobolev with any positive index as well as for Efromovich–Pinsker ...
Optimal regression rates for SVMs using Gaussian kernels
Support vector machines (SVMs) using Gaussian kernels are one of the standard and state-of-the-art learning algorithms. In this work, we establish new oracle inequalities for such SVMs when applied to either least squares or conditional quantile regression. With the help of these oracle inequalities we then derive learning rates that are (essentially) minimax optimal under standard smoothness as...
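For intuition, here is a minimal least-squares sketch in this spirit, using kernel ridge regression with a Gaussian kernel as a stand-in for the regularized least-squares SVM; the data, the grid of kernel widths, and the regularization values are illustrative assumptions.

```python
# Regularized least-squares regression with a Gaussian (RBF) kernel,
# tuning the kernel width gamma and regularization alpha by cross-validation.

import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(400, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(400)

# Kernel k(x, x') = exp(-gamma * ||x - x'||^2); grid values are illustrative.
grid = {"alpha": np.logspace(-6, 0, 7), "gamma": np.logspace(-1, 3, 9)}
model = GridSearchCV(KernelRidge(kernel="rbf"), grid, cv=5).fit(X, y)
f_hat = model.best_estimator_
```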
Nonparametric statistical inverse problems
We explain some basic theoretical issues regarding nonparametric statistics applied to inverse problems. Simple examples are used to present classical concepts such as the white noise model, risk estimation, minimax risk, model selection and optimal rates of convergence, as well as more recent concepts such as adaptive estimation, oracle inequalities, modern model selection methods, Stein’s unb...
Adaptive Wavelet Estimation: A Block Thresholding And Oracle Inequality Approach
We study wavelet function estimation via the approach of block thresholding and ideal adaptation with oracle. Oracle inequalities are derived and serve as guides for the selection of smoothing parameters. Based on an oracle inequality and motivated by the data compression and localization properties of wavelets, an adaptive wavelet estimator for nonparametric regression is proposed and the opti...
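A minimal sketch of block thresholding in this spirit (not the paper's estimator): group the empirical wavelet coefficients at each resolution level into blocks of length about log n and apply a James-Stein-type shrinkage to each block. The block length, the threshold constant, and the MAD noise estimate below are illustrative assumptions.

```python
# Block-thresholding wavelet estimator for regression on an equispaced grid.

import numpy as np
import pywt

def block_threshold(y, wavelet="db4", lam=4.505):
    n = len(y)
    coeffs = pywt.wavedec(y, wavelet)                 # [approx, detail_J, ..., detail_1]
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745    # MAD noise estimate from finest level
    L = max(1, int(np.log(n)))                        # block length ~ log n
    out = [coeffs[0]]                                 # keep coarse coefficients untouched
    for d in coeffs[1:]:
        d = d.copy()
        for start in range(0, len(d), L):
            block = d[start:start + L]
            energy = np.sum(block ** 2)
            shrink = max(0.0, 1.0 - lam * len(block) * sigma ** 2 / energy) if energy > 0 else 0.0
            d[start:start + L] = shrink * block       # James-Stein-type block shrinkage
        out.append(d)
    return pywt.waverec(out, wavelet)[:n]

# Toy usage on a noisy signal with a jump and a smooth component.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 1024)
y = np.where(t > 0.5, 1.0, 0.0) + np.sin(4 * np.pi * t) + 0.3 * rng.standard_normal(1024)
f_hat = block_threshold(y)
```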